8,532 research outputs found

    Treatment of the background error in the statistical analysis of Poisson processes

    The formalism that allows one to take into account the error sigma_b of the expected mean background b in the statistical analysis of a Poisson process with the frequentist method is presented. It is shown that the error sigma_b cannot be neglected if it is not much smaller than sqrt(b). The resulting confidence belt is larger than the one for sigma_b=0, leading to larger confidence intervals for the mean mu of signal events. Comment: 15 pages including 2 figures, RevTeX. Final version published in Phys. Rev. D 59 (1999) 11300
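
    As a rough numerical illustration of the effect described in this abstract (not the paper's own frequentist belt construction), the sketch below folds a Gaussian uncertainty of width sigma_b into the Poisson probability and computes a simple scan-based upper limit; the function names and the numbers in the example are illustrative assumptions.

```python
# Sketch only: background uncertainty folded in by averaging the Poisson
# probability over a (truncated) Gaussian for the true background.
import numpy as np
from scipy import stats

def smeared_pmf(n, mu, b, sigma_b, n_grid=2000):
    """P(n | mu): Poisson probability averaged over the background uncertainty."""
    if sigma_b == 0:
        return stats.poisson.pmf(n, mu + b)
    bp = np.linspace(max(0.0, b - 5 * sigma_b), b + 5 * sigma_b, n_grid)
    w = stats.norm.pdf(bp, loc=b, scale=sigma_b)
    w /= w.sum()                      # discrete weights, renormalised after truncation at 0
    return float(np.sum(stats.poisson.pmf(n, mu + bp) * w))

def upper_limit(n_obs, b, sigma_b, cl=0.90):
    """Scan-based one-sided upper limit on the signal mean mu."""
    for mu in np.arange(0.0, 50.0, 0.05):
        p_le = sum(smeared_pmf(n, mu, b, sigma_b) for n in range(n_obs + 1))
        if p_le < 1.0 - cl:
            return mu
    return float("inf")

# The limit widens once sigma_b becomes comparable to sqrt(b), as stated above.
print(upper_limit(n_obs=3, b=3.0, sigma_b=0.0))
print(upper_limit(n_obs=3, b=3.0, sigma_b=1.7))
```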

    Comment on "Including Systematic Uncertainties in Confidence Interval Construction for Poisson Statistics"

    The incorporation of systematic uncertainties into confidence interval calculations has been addressed recently in a paper by Conrad et al. (Physical Review D 67 (2003) 012002). In their work, systematic uncertainties in detector efficiencies and background flux predictions were incorporated following the hybrid frequentist-Bayesian prescription of Cousins and Highland, but using the likelihood ratio ordering of Feldman and Cousins in order to produce "unified" confidence intervals. In general, the resulting intervals behaved as one would intuitively expect, i.e. they increased with increasing uncertainties. However, it was noted that for numbers of observed events less than or of the order of the expected background, the intervals could sometimes behave in a completely counter-intuitive fashion -- initially decreasing in the face of increasing uncertainties, though only in the case of increasing signal efficiency uncertainty. In this comment, we show that the problematic behaviour is due to integration over the signal efficiency uncertainty while maximising the best-fit alternative hypothesis likelihood. If the alternative hypothesis likelihood is instead determined by unconditionally maximising with respect to both the unknown signal and the signal efficiency, the limits display the correct intuitive behaviour. Comment: Submitted to Physical Review
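
    The sketch below illustrates, on a toy counting experiment, the difference between the two orderings discussed in this comment: a ratio whose denominator is also integrated over the efficiency uncertainty versus one maximised unconditionally over both the signal and the efficiency. It is a schematic reconstruction, not the Conrad et al. code; the model n ~ Pois(eps*mu + b) and all numerical values are assumptions.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize_scalar

def integrated_like(n, mu, b, eps0, sigma_eps, n_grid=1000):
    """Likelihood of n ~ Pois(eps*mu + b), integrated over the efficiency PDF."""
    eps = np.linspace(max(0.0, eps0 - 5 * sigma_eps), eps0 + 5 * sigma_eps, n_grid)
    w = stats.norm.pdf(eps, eps0, sigma_eps)
    w /= w.sum()
    return float(np.sum(stats.poisson.pmf(n, eps * mu + b) * w))

def rank_integrated(n, mu, b, eps0, sigma_eps):
    """Ordering ratio with the alternative hypothesis also integrated over eps
    (the construction that can behave counter-intuitively)."""
    res = minimize_scalar(lambda m: -integrated_like(n, m, b, eps0, sigma_eps),
                          bounds=(0.0, 60.0), method="bounded")
    return integrated_like(n, mu, b, eps0, sigma_eps) / (-res.fun)

def rank_unconditional(n, mu, b, eps0, sigma_eps):
    """Ordering ratio with the alternative maximised unconditionally over both
    the signal and the efficiency (the behaviour advocated in the comment)."""
    best = stats.poisson.pmf(n, max(n, b))   # max of Pois(n; lam) over lam >= b
    return integrated_like(n, mu, b, eps0, sigma_eps) / best

print(rank_integrated(2, mu=1.0, b=3.0, eps0=1.0, sigma_eps=0.3))
print(rank_unconditional(2, mu=1.0, b=3.0, eps0=1.0, sigma_eps=0.3))
```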

    Access Anytime Anyplace: An Empirical Investigation of Patterns of Technology Use in Nomadic Computing Environments

    With the increasing pervasiveness of mobile technologies such as cellular phones, personal digital assistants and handheld computers, these technologies promise the next major technological and cultural shift. As with the Internet, it is predicted that the greatest impact will come not from hardware devices or software programs, but from emerging social practices that were not possible before. To capitalize on the benefits of mobile technologies, organizations have begun to implement nomadic computing environments. Nomadic computing environments make available the systems support needed to provide computing and communication capabilities and services to the mobile workforce as they move from place to place in a manner that is transparent, integrated, convenient and adaptive. Anecdotal evidence already suggests that these environments have social implications within organizations, with both intended and unintended consequences. The problems of nomadic computing users have widely been described in terms of the challenges presented by the interplay of time, space and context, yet no theory has been developed that analyzes this interplay in a single effort. A temporal human agency perspective proposes that stakeholders’ actions are influenced by their ability to recall the past, respond to the present and imagine the future. By extending the temporal human agency perspective through the recognition of the combined influence of space and context on human action, I investigated how the individual practices of eleven nomadic computing users changed after implementation. Under the umbrella of the interpretive paradigm, and using a cross-case methodology, this research develops a theoretical account of how several stakeholders engaged with different nomadic computing environments and explores the context of their effectiveness. Applying a literal and theoretical replication strategy to multiple longitudinal and retrospective cases, I spent six months in the field interviewing and observing participants. Data analysis included three types of coding: descriptive, interpretive and pattern coding. The findings reveal that patterns of technology use in nomadic computing environments are influenced by stakeholders’ temporal orientations: their ability to remember the past, imagine the future and respond to the present. As stakeholders all have different temporal orientations and experiences, they exhibit different practices even when engaging initially with the same organizational and technical environments. Opposing forces emerge as users attempt to be effective by weighing the benefits and disadvantages of the environment as they undergo different temporal, contextual and spatial experiences. Insights about the ability to predict future use suggest that, because social processes are difficult to envisage in advance, they inhibit the predictability of which technologies users will adopt. The framework presented highlights the need to focus on understanding the diversity in nomadic computing use practices by examining how they are influenced by individual circumstances as well as shared meanings across individuals.

    Unbiased cut selection for optimal upper limits in neutrino detectors: the model rejection potential technique

    We present a method for optimising experimental cuts in order to place the strongest constraints (upper limits) on theoretical signal models. The method relies only on signal and background expectations derived from Monte Carlo simulations, so no bias is introduced by looking at actual data, for instance by setting a limit based on the expected signal above the "last remaining data event". After discussing the concept of the "average upper limit", based on the expectation from an ensemble of repeated experiments with no true signal, we show how the best model rejection potential is achieved by optimising the cuts to minimise the ratio of this average upper limit to the expected signal from the model. As an example, we use this technique to determine the limit sensitivity of kilometre-scale neutrino detectors to extra-terrestrial neutrino fluxes from a variety of models, e.g. active galaxies and gamma-ray bursts. We suggest that these model-rejection-potential-optimised limits be used as a standard method of comparing the sensitivity of proposed neutrino detectors. Comment: 18 pages, 7 figures, submitted to Astroparticle Physics
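
    A compact sketch of the model rejection potential procedure is given below; for brevity it uses a classical one-sided Poisson upper limit in place of the Feldman-Cousins limit used in the paper, and the signal/background expectations as a function of the cut are invented placeholders.

```python
import numpy as np
from scipy import stats

def upper_limit(n_obs, b, cl=0.90):
    """One-sided upper limit on the signal mean mu for n_obs observed counts on background b."""
    mus = np.arange(0.0, 60.0, 0.01)
    p_le = stats.poisson.cdf(n_obs, mus + b)      # P(n <= n_obs | mu + b)
    below = np.where(p_le < 1.0 - cl)[0]
    return mus[below[0]] if below.size else mus[-1]

def average_upper_limit(b, cl=0.90, n_max=60):
    """Expectation of the upper limit over an ensemble of background-only experiments."""
    return sum(stats.poisson.pmf(n, b) * upper_limit(n, b, cl) for n in range(n_max))

def model_rejection_factor(n_signal, b):
    """Ratio to minimise over the cut: average upper limit / expected model signal."""
    return average_upper_limit(b) / n_signal

# Hypothetical Monte Carlo expectations as a function of a cut value (illustrative only);
# the optimal cut is the one minimising the model rejection factor.
cuts = np.linspace(0.0, 5.0, 11)
n_sig = 10.0 * np.exp(-0.2 * cuts)
n_bkg = 50.0 * np.exp(-1.0 * cuts)
mrf = [model_rejection_factor(s, b) for s, b in zip(n_sig, n_bkg)]
print("best cut:", cuts[int(np.argmin(mrf))])
```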

    Relations of Mercury to other Metals

    n/

    Setting UBVRI Photometric Zero-Points Using Sloan Digital Sky Survey ugriz Magnitudes

    We discuss the use of Sloan Digital Sky Survey (SDSS) ugriz point-spread function (PSF) photometry for setting the zero points of UBVRI CCD images. From a comparison with the Landolt (1992) standards and our own photometry we find that there is a fairly abrupt change in the B, V, R, and I zero points around g, r, i ~ 14.5, and in the U zero point at u ~ 16. These changes correspond to where there is significant interpolation due to saturation in the SDSS PSF fluxes. There also seems to be another, much smaller systematic effect for stars with g, r > 19.5. The latter effect is consistent with a small Malmquist bias. Because of the difficulties with PSF fluxes of brighter stars, we recommend that comparisons of ugriz and UBVRI photometry should only be made for unsaturated stars with g, r and i in the range 14.5 - 19.5, and u in the range 16 - 19.5. We give a prescription for setting the UBVRI zero points for CCD images, and general equations for transforming from ugriz to UBVRI. Comment: 13 pages, 6 figures. Accepted for publication in the Astronomical Journal
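
    The sketch below outlines one way such a zero-point prescription could be applied: select SDSS stars inside the recommended magnitude ranges, transform to V with a linear colour term, and take the median offset against instrumental magnitudes. The transformation coefficients are placeholders, not the equations given in the paper, and the photometry is made up.

```python
import numpy as np

def usable(u, g, r, i):
    """Selection the abstract recommends: unsaturated, above the faint-end systematic."""
    return (g > 14.5) & (r > 14.5) & (i > 14.5) & (u > 16.0) & \
           (g < 19.5) & (r < 19.5) & (i < 19.5) & (u < 19.5)

def sdss_to_V(g, r, a=0.0, b_coef=-0.6):
    """Placeholder linear colour transformation V = g + b_coef*(g - r) + a
    (substitute the published relation and coefficients)."""
    return g + b_coef * (g - r) + a

def zero_point(instrumental, catalog):
    """Zero point as the median offset between catalogue and instrumental magnitudes."""
    return np.median(catalog - instrumental)

# usage with made-up SDSS PSF photometry and instrumental V for the same stars
u, g, r, i = map(np.array, ([17.0, 18.2, 15.0], [16.5, 17.9, 14.0],
                            [16.2, 17.5, 13.8], [16.0, 17.3, 13.6]))
v_inst = np.array([-8.1, -6.8, -10.5])
keep = usable(u, g, r, i)               # the third star fails the bright cut
print("V zero point:", zero_point(v_inst[keep], sdss_to_V(g[keep], r[keep])))
```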

    Including Systematic Uncertainties in Confidence Interval Construction for Poisson Statistics

    One way to incorporate systematic uncertainties into the calculation of confidence intervals is by integrating over probability density functions parametrizing the uncertainties. In this note we present a development of this method which takes into account uncertainties in the prediction of background processes, uncertainties in the signal detection efficiency and background efficiency, and allows for a correlation between the signal and background detection efficiencies. We implement this method with the Feldman & Cousins unified approach with and without conditioning. We present studies of coverage for the Feldman & Cousins and Neyman ordering schemes. In particular, we present two different types of coverage tests for the case where systematic uncertainties are included. To illustrate the method we show the relative effect of including systematic uncertainties in the case of a dark matter search as performed by modern neutrino telescopes. Comment: 23 pages, 10 figures, replaced to match published version
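
    The sketch below shows the general shape of a toy-Monte-Carlo coverage test of the kind mentioned in this abstract, with the true efficiency and background fluctuated according to their uncertainty PDFs for each pseudo-experiment. The interval constructor used here is a deliberately crude stand-in, not the Feldman & Cousins construction of the paper.

```python
import numpy as np
rng = np.random.default_rng(1)

def coverage(mu_true, eps0, sigma_eps, b0, sigma_b, make_interval, n_toys=2000):
    """Fraction of pseudo-experiments whose reported interval covers mu_true,
    with the true efficiency and background drawn from their uncertainty PDFs."""
    covered = 0
    for _ in range(n_toys):
        eps = max(0.0, rng.normal(eps0, sigma_eps))   # true efficiency for this toy
        b = max(0.0, rng.normal(b0, sigma_b))         # true background for this toy
        n = rng.poisson(eps * mu_true + b)
        lo, hi = make_interval(n, eps0, sigma_eps, b0, sigma_b)
        covered += (lo <= mu_true <= hi)
    return covered / n_toys

def crude_interval(n, eps0, sigma_eps, b0, sigma_b):
    """Stand-in interval constructor (NOT Feldman-Cousins): a rough central
    interval from +-1.64 sigma Poisson fluctuations, for illustration only."""
    half = 1.64 * np.sqrt(n + 1)
    return max(0.0, (n - half - b0) / eps0), (n + half - b0) / eps0

print(coverage(5.0, 0.8, 0.1, 3.0, 1.0, make_interval=crude_interval))
```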

    Spectral identification and quantification of salts in the Atacama Desert

    This work was part-funded by a Research Incentive Grant from The Carnegie Trust (REF: 70335) and a Royal Society of Edinburgh Research Fellowship to C. Cousins. J. Harris acknowledges funding from STFC (consolidated grant ST/N000528/1). Salt minerals are an important natural resource. The ability to quickly and remotely identify and quantify salt deposits and salt-contaminated soils and sands is therefore a priority goal for the various industries and agencies that utilise salts. The advent of global hyperspectral imagery from instruments such as Hyperion on NASA’s Earth-Observing 1 (EO-1) satellite has opened up a new source of data that can potentially be used for just this task. This study aims to assess the ability of Visible and Near Infrared (VNIR) spectroscopy to identify and quantify salt minerals through the use of spectral mixture analysis (SMA). The surface and near-surface soils of the Atacama Desert in Chile contain a variety of well-studied salts, which, together with low cloud coverage and high aridity, makes this region an ideal testbed for this technique. Two forms of spectral data spanning 0.35–2.5 μm were collected: laboratory spectra acquired using an ASD FieldSpec Pro instrument on samples from four locations in the Atacama Desert known to have surface concentrations of sulfates, nitrates, chlorides and perchlorates; and images from the EO-1 satellite’s Hyperion instrument taken over the same four locations. Mineral identifications and abundances were confirmed using quantitative XRD of the physical samples. Spectral endmembers were extracted from within the laboratory and Hyperion spectral datasets and, together with additional spectral library endmembers, fed into a linear mixture model. The resulting identifications and abundances from both dataset types were verified against the sample XRD values. Issues of spectral scale, SNR and how different mineral spectra interact are considered, and the utility of VNIR spectroscopy, and Hyperion in particular, for mapping specific salt concentrations in desert environments is established. Overall, SMA was successful at estimating abundances of sulfate minerals, particularly calcium sulfate, from both hyperspectral image and laboratory sample spectra, while abundance estimation of other salt-phase spectral endmembers was achieved with a higher degree of error. Publisher PDF
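
    A minimal sketch of linear spectral mixture analysis of the kind used in this study is given below: a measured spectrum is modelled as a non-negative linear combination of endmember spectra, and the fitted coefficients are interpreted as abundances. The endmember shapes and the noisy "measured" spectrum are synthetic placeholders, not Atacama or Hyperion data.

```python
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(0.35, 2.5, 200)                # microns, VNIR range
# columns = endmember spectra (toy Gaussian-shaped stand-ins for library spectra)
E = np.column_stack([
    np.exp(-((wavelengths - c) / 0.3) ** 2)
    for c in (0.9, 1.4, 1.9, 2.2)
])
true_abund = np.array([0.5, 0.2, 0.2, 0.1])
measured = E @ true_abund + 0.01 * np.random.default_rng(0).normal(size=len(wavelengths))

coeffs, residual = nnls(E, measured)                     # non-negative least squares
abundances = coeffs / coeffs.sum()                       # normalise to fractional abundances
print(np.round(abundances, 2), "residual:", round(residual, 3))
```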